Empirical Results on the Generalization Capabilities and Convergence Properties of the Bayes Point Machine

Author

  • Sumit Basu
Abstract

We explore some of the convergence and generalization properties of the Bayes Point Machine (BPM) developed by Herbrich [3], an alternative to the Support Vector Machine (SVM) for classification. In the separable case, there are an infinite number of hyperplanes in version space that will perfectly separate the data. Instead of choosing a solution by maximizing the margin (as with the SVM), the BPM seeks an approximation to the Bayes point, the point in version space whose behavior is closest to the Bayes integral over all hyperplanes. Herbrich approximates this point using a stochastic algorithm for an arbitrary kernel (the billiard algorithm), which bounces a ball around the version space in order to estimate the Bayes point. Because many kernels imply infinite-dimensional feature spaces, it is interesting to investigate how long such an algorithm must run before it converges. In a series of experiments on separable data (i.e., separable in the feature space), we therefore test the BPM algorithm with a polynomial kernel (low-dimensional) and an RBF kernel (infinite-dimensional). We compare generalization results with the SVM, investigate convergence rates, and examine the difference between the BPM and SVM solutions under several data conditions. We find that the BPM does converge rapidly and tightly even for infinite-dimensional kernels, and that it has significantly better generalization performance only when the number of support vectors is moderate relative to the number of training points (i.e., more often with low-dimensional kernels). In addition, we augment Herbrich's discussion with some comments on the bias term, corrections to his pseudocode, and a MATLAB implementation of his algorithm to be made publicly available.
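The billiard algorithm itself is beyond the scope of this abstract, but the center-of-mass idea it approximates can be sketched with naive rejection sampling: draw random unit weight vectors, keep only those lying in version space (i.e., classifying every training point correctly), and average them. This is a hypothetical illustration of the Bayes point concept, not Herbrich's billiard algorithm; the data set and all names below are invented for the example (zero-bias, linearly separable case).

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny linearly separable 2-D data set (hypothetical example).
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -2.0], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])

def in_version_space(w):
    """A weight vector lies in version space iff it classifies every
    training point correctly (zero-bias, separable case)."""
    return bool(np.all(y * (X @ w) > 0))

# Naive Monte Carlo stand-in for the billiard algorithm: sample unit
# vectors uniformly on the circle, keep those in version space, and
# average them to estimate the version space's center of mass.
samples = []
while len(samples) < 2000:
    w = rng.standard_normal(2)
    w /= np.linalg.norm(w)
    if in_version_space(w):
        samples.append(w)

w_bp = np.mean(samples, axis=0)
w_bp /= np.linalg.norm(w_bp)  # approximate Bayes point direction
print(w_bp)
```

Because version space is a convex cone in the separable case, the average of unit vectors sampled from it (once normalized) again lies in version space; the billiard algorithm pursues the same center-of-mass estimate far more efficiently, and in an arbitrary kernel-induced feature space.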


Similar articles

Invariant Empirical Bayes Confidence Interval for Mean Vector of Normal Distribution and its Generalization for Exponential Family

Based on a given Bayesian model of a multivariate normal with a known variance matrix, we find an empirical Bayes confidence interval for the mean vector components, which have a normal distribution. We find this empirical Bayes confidence interval conditional on an ancillary statistic. In both cases (i.e., conditional and unconditional empirical Bayes confidence intervals), the empiri...

Full text

Limiting Properties of Empirical Bayes Estimators in a Two-Factor Experiment under Inverse Gaussian Model

The empirical Bayes estimators of treatment effects in a factorial experiment are derived and their asymptotic properties explored. It is shown that they are asymptotically optimal, and that the estimator of the scale parameter has a limiting gamma distribution while the estimators of the factor effects have a limiting multivariate normal distribution. A bootstrap analysis was performed to ill...

Full text

On the approximation by Chlodowsky type generalization of (p,q)-Bernstein operators

In the present article, we introduce a Chlodowsky variant of the $(p,q)$-Bernstein operators and compute the moments of these operators, which are used in proving our main results. Further, we study some approximation properties of these new operators, including the rate of convergence using the usual modulus of continuity, and also the rate of convergence when the function $f$ belongs to the class Li...

Full text

Some Asymptotic Properties of the Support Vector Machine

The support vector machine methodology is a rapidly growing area of research in machine learning. A number of computational learning-theoretic results on the support vector machine have appeared in the machine learning literature. Typically the generalization error of the support vector machine is shown to be bounded by quantities that are related to the empirical margin of the training ...

Full text

Strong convergence results for fixed points of nearly weak uniformly L-Lipschitzian mappings of I-Dominated mappings

In this paper, we prove strong convergence results for a modified Mann iterative process for a new class of I-nearly weak uniformly L-Lipschitzian mappings in a real Banach space. The class of I-nearly weak uniformly L-Lipschitzian mappings is an interesting generalization of the class of nearly weak uniformly L-Lipschitzian mappings, which in turn is a generalization of the class of nearly unif...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 2001